
    Interpreting random forest models using a feature contribution method

    Model interpretation is one of the key aspects of the model evaluation process. Explaining the relationship between model variables and outputs is straightforward for statistical models such as linear regression, thanks to the availability of model parameters and their statistical significance. For “black box” models, such as random forests, this information is hidden inside the model structure. This work presents an approach for computing feature contributions for random forest classification models. It allows the influence of each variable on the model prediction to be determined for an individual instance. Interpretation of feature contributions for two UCI benchmark datasets shows the potential of the proposed methodology. The robustness of the results is demonstrated through an extensive analysis of feature contributions calculated for a large number of generated random forest models.
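
    As a rough illustration of the decision-path idea behind such feature contributions, the sketch below walks each tree of a scikit-learn random forest and credits the change in the class-probability estimate at every split to the splitting feature. The helper name feature_contributions, the dataset and all details are illustrative assumptions, not the authors' implementation; for a single instance the bias plus the summed contributions reproduces the forest's predicted class probabilities.

        # A minimal sketch, not the paper's method: walk each tree's decision path
        # and credit changes in the class-probability estimate to the split feature.
        import numpy as np
        from sklearn.datasets import load_breast_cancer
        from sklearn.ensemble import RandomForestClassifier

        def feature_contributions(forest, x):
            """Return (bias, contributions) for one instance x of shape (n_features,)."""
            bias = np.zeros(forest.n_classes_)
            contrib = np.zeros((x.shape[0], forest.n_classes_))
            for est in forest.estimators_:
                tree = est.tree_
                node = 0
                prev = tree.value[node][0] / tree.value[node][0].sum()  # root class distribution
                bias += prev
                while tree.children_left[node] != -1:                   # descend until a leaf
                    feat = tree.feature[node]
                    node = (tree.children_left[node] if x[feat] <= tree.threshold[node]
                            else tree.children_right[node])
                    cur = tree.value[node][0] / tree.value[node][0].sum()
                    contrib[feat] += cur - prev          # credit the change to the split feature
                    prev = cur
            n = len(forest.estimators_)
            return bias / n, contrib / n

        X, y = load_breast_cancer(return_X_y=True)
        rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
        bias, contrib = feature_contributions(rf, X[0])
        # bias plus the summed contributions reproduces predict_proba for this instance
        print(np.allclose(bias + contrib.sum(axis=0), rf.predict_proba(X[:1])[0]))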

    Effect of lamotrigine on cognition in children with epilepsy

    Background: Lamotrigine does not affect cognition in healthy adult volunteers or adult patients with epilepsy, but its effect on cognition in children is uncertain.
    Objective: To compare the effect of lamotrigine and placebo on cognition in children with well-controlled or mild epilepsy.
    Method: In a double-blind, placebo-controlled, crossover study, 61 children with well-controlled or mild epilepsy were randomly assigned to add-on therapy with either lamotrigine followed by placebo or placebo followed by lamotrigine. Each treatment phase lasted 9 weeks; the crossover period was 5 weeks. A neuropsychological test battery was performed during EEG monitoring at baseline and at the end of the placebo and drug phases. The paired Student's t test (two-tailed) was used for statistical analysis of the neuropsychological data, with a p value of 0.01 considered significant. Carryover and period effects were analyzed with generalized linear modeling (SPSS 10).
    Results: Forty-eight children completed the study. Seizure frequency was similar during both treatment phases. No significant differences were found in continuous performance, binary choice reaction time, verbal and nonverbal recognition, computerized visual searching, verbal and spatial delayed recognition, or verbal and nonverbal working memory between the placebo and lamotrigine treatment phases. There were no significant carryover or period effects when corrected for randomization.
    Conclusion: Lamotrigine exhibits no clinically significant cognitive effects as adjunctive therapy in children with epilepsy.
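
    For readers who want the shape of the statistical comparison, the fragment below runs the paired, two-tailed Student's t test described above on synthetic placeholder scores; the 0.01 threshold and the number of completers come from the abstract, but none of the values are study data.

        # Paired, two-tailed comparison of placebo vs. lamotrigine phase scores,
        # on synthetic placeholder data (not the study's), with alpha = 0.01.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        n = 48                                        # children who completed the study
        placebo = rng.normal(100, 15, n)              # hypothetical neuropsychological scores
        lamotrigine = placebo + rng.normal(0, 5, n)   # hypothetical scores in the drug phase

        t, p = stats.ttest_rel(placebo, lamotrigine)  # paired Student's t test, two-tailed
        print(f"t = {t:.2f}, p = {p:.3f}, significant at 0.01: {p < 0.01}")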

    Photon-number-resolution with sub-30-ps timing using multi-element superconducting nanowire single photon detectors

    A photon-number-resolving detector based on a four-element superconducting nanowire single photon detector is demonstrated to have sub-30-ps resolution in measuring the arrival time of individual photons. This detector can be used to characterize the photon statistics of non-pulsed light sources and to mitigate dead-time effects in high-speed photon counting applications. Furthermore, a 25% system detection efficiency at 1550 nm was demonstrated, making the detector useful for both low-flux source characterization and high-speed photon-counting and quantum communication applications. The design, fabrication and testing of this detector are described, and a comparison between the measured and theoretical performance is presented.
    Comment: 13 pages, 5 figures

    Air Fraction Correction Optimisation in PET Imaging of Lung Disease

    Accurate quantification of radiopharmaceutical uptake from lung PET/CT is challenging due to large variations in the fractions of tissue, air, blood and water. Air fraction correction (AFC) uses voxel-wise air fractions, which can be determined from the CT acquired for attenuation correction (AC). However, resolution effects can cause artefacts in either of these corrections. In this work, we hypothesise that the resolution of the CT image used for AC should match the intrinsic resolution of the PET scanner, but should approximate the reconstructed PET image resolution for AFC. Simulations and reconstructions were performed with the Synergistic Image Reconstruction Framework (SIRF) using phantoms with inhomogeneous attenuation (mu) maps, mimicking the densities observed in lung pathologies. Poisson noise was added to the projection data prior to OSEM reconstruction. AC was performed with a smoothed mu-map; the full width at half maximum (FWHM) of the 3D Gaussian kernel was varied (0-10 mm). Post-filters were applied to the reconstructed AC images (FWHM: 0-8 mm). The simulated mu-map was independently convolved with another set of 3D Gaussian kernels of varying FWHM (0-12 mm) for AFC. The coefficient of variation (CV) in the lung region, designed to be homogeneous post-AFC with optimised kernels, and the mean AFC-standardised uptake value (AFC-SUV) in the regions of simulated pathologies were determined. The spatial resolution of each post-filtered image was determined via a point-source insertion-and-subtraction method on noiseless data. Results showed that the CV was minimised when the kernel applied to the mu-map for AC matched that of the simulated PET scanner and the kernel applied to the mu-map for AFC matched the spatial resolution of the reconstructed PET image. This was observed for all post-reconstruction filters and supports the hypothesis. Initial results from Monte Carlo simulations validate these findings.
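
    The sketch below shows the AFC step itself in a hedged form: the mu-map is smoothed with a Gaussian kernel (here intended to approximate the reconstructed PET resolution), converted to voxel-wise air fractions, and the PET image is divided by the resulting tissue fraction. The 511 keV water value, kernel widths and toy phantom are illustrative assumptions, not the paper's SIRF pipeline.

        # Sketch of voxel-wise air-fraction correction from a CT-derived mu-map,
        # assuming mu(air) ~ 0 and mu(water) ~ 0.096 /cm at 511 keV (approximate).
        import numpy as np
        from scipy.ndimage import gaussian_filter

        MU_WATER = 0.096   # /cm at 511 keV (approximate)
        MU_AIR = 0.0

        def afc(pet_img, mu_map, fwhm_mm, voxel_mm, eps=0.05):
            """Divide PET uptake by the tissue fraction derived from a smoothed mu-map."""
            sigma_vox = (fwhm_mm / 2.355) / voxel_mm          # FWHM -> Gaussian sigma, in voxels
            mu_smooth = gaussian_filter(mu_map, sigma_vox)    # approximate reconstructed PET resolution
            air_fraction = np.clip((MU_WATER - mu_smooth) / (MU_WATER - MU_AIR), 0.0, 1.0)
            tissue_fraction = np.clip(1.0 - air_fraction, eps, 1.0)  # avoid dividing by ~0 in airways
            return pet_img / tissue_fraction

        # Toy phantom: a uniform "lung" region with a 70% air fraction.
        mu = np.full((32, 32, 32), 0.3 * MU_WATER)
        pet = np.full((32, 32, 32), 1.0)
        corrected = afc(pet, mu, fwhm_mm=8.0, voxel_mm=2.0)
        print(corrected.mean())   # ~3.3: uptake per unit of tissue rather than per voxel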

    Practical Evaluation of Lempel-Ziv-78 and Lempel-Ziv-Welch Tries

    We present the first thorough practical study of Lempel-Ziv-78 and Lempel-Ziv-Welch computation based on trie data structures. With a careful selection of trie representations, we can beat well-tuned popular trie data structures such as Judy, m-Bonsai or Cedar.
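
    To make the object of study concrete, here is a minimal LZ78 factorization driven by a dictionary-backed trie; it illustrates the trie operations being benchmarked but is nowhere near the tuned representations (Judy, m-Bonsai, Cedar) the paper compares against.

        # Toy LZ78 factorization with a dictionary-backed trie (illustrative only).
        def lz78_factorize(text: str):
            """Return LZ78 factors as (referenced factor index, extending character) pairs."""
            trie = {}            # maps (parent factor index, char) -> factor index
            factors = []
            node = 0             # index 0 is the implicit empty factor at the trie root
            for ch in text:
                if (node, ch) in trie:          # extend the current phrase inside the trie
                    node = trie[(node, ch)]
                else:                           # new phrase: add a trie edge, emit a factor
                    factors.append((node, ch))
                    trie[(node, ch)] = len(factors)
                    node = 0
            if node != 0:                       # flush a pending phrase already in the trie
                factors.append((node, ""))
            return factors

        print(lz78_factorize("abababab"))  # [(0, 'a'), (0, 'b'), (1, 'b'), (3, 'a'), (2, '')]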

    The Utility of a High-intensity Exercise Protocol to Prospectively Assess ACL Injury Risk.

    This study investigated the utility of a 5-min high-intensity exercise protocol (SAFT(5)) for inclusion in prospective cohort studies investigating ACL injury risk. 15 active females were tested on 2 occasions during which their non-dominant leg was analysed before SAFT(5) (PRE), immediately after (POST0), 15 min after (POST15), and 30 min after (POST30). On the first occasion, testing comprised 5 maximum isokinetic contractions for the eccentric and concentric hamstrings and the concentric quadriceps; on the second occasion, 3 trials of 2 landing tasks (i.e., single-leg hop and drop vertical jump) were conducted. Results showed a reduced eccentric hamstring peak torque at POST0, POST15 and POST30 (p<0.05) and a reduced functional HQ ratio (Hecc/Qcon) at POST15 and POST30 (p<0.05). Additionally, a more extended knee angle at POST30 (p<0.05) and an increased knee internal rotation angle at POST0 and POST15 (p<0.05) were found in the single-leg hop. SAFT(5) altered landing strategies in ways associated with increased ACL injury risk and similar to observations from match simulations. Our findings therefore support the utility of a high-intensity exercise protocol such as SAFT(5) to strengthen injury screening tests and for inclusion in prospective cohort studies where time constraints apply.
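
    As a small illustration of the functional HQ ratio (Hecc/Qcon) tracked above, the fragment below computes the ratio at each time point from made-up peak-torque values and runs paired comparisons against PRE; none of the numbers are study data.

        # Functional hamstring-to-quadriceps ratio (Hecc/Qcon) on hypothetical torques (Nm).
        import numpy as np
        from scipy import stats

        # Rows: participants; columns: PRE, POST0, POST15, POST30 (made-up values)
        hecc = np.array([[140, 120, 118, 115]] * 15, dtype=float) + np.random.default_rng(1).normal(0, 8, (15, 4))
        qcon = np.array([[180, 178, 176, 175]] * 15, dtype=float) + np.random.default_rng(2).normal(0, 8, (15, 4))

        hq = hecc / qcon                                    # functional H/Q ratio per time point
        for i, label in enumerate(["POST0", "POST15", "POST30"]):
            t, p = stats.ttest_rel(hq[:, 0], hq[:, i + 1])  # paired comparison vs. PRE
            print(f"PRE vs {label}: mean ratio {hq[:, i + 1].mean():.2f}, p = {p:.3f}")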

    Towards Constructing Fully Homomorphic Encryption without Ciphertext Noise from Group Theory

    In CRYPTO 2008, one year before Gentry's pioneering “bootstrapping” technique for constructing the first fully homomorphic encryption (FHE) scheme, Ostrovsky and Skeith III suggested a completely different approach towards achieving FHE. Namely, they showed that the NAND operator can be realized in some non-commutative groups; consequently, in combination with the NAND operator realized in such a group, homomorphically encrypting the elements of the group will yield an FHE scheme. However, no observations on how to homomorphically encrypt the group elements were presented in their paper, and there have been no follow-up studies in the literature based on their approach. The aim of this paper is to exhibit more clearly what is sufficient and what seems to be effective for constructing FHE schemes based on their approach. First, we prove that it is sufficient to find a surjective homomorphism π: G̃ → G between finite groups for which bit operators are realized in G and the elements of the kernel of π are indistinguishable from the general elements of G̃. Secondly, we propose new methodologies to realize bit operators in some groups, which enlarges the possibilities for the group G to be used in our framework. Thirdly, we observe that a naive approach using matrix groups would never yield secure FHE, due to an attack utilizing the “linearity” of the construction. We then propose an idea to avoid such “linearity” by using combinatorial group theory, and give a prototypical but still incomplete construction in the sense that it is “non-compact” FHE, i.e., the ciphertext size is unbounded (though the ciphertexts are noise-free, as opposed to the existing FHE schemes). Completely realizing FHE schemes based on our proposed framework is left as a future research topic.
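
    The central object of the framework, a surjective homomorphism π: G̃ → G whose kernel elements mask plaintexts, can be shown in a deliberately toy and insecure form. The sketch below uses abelian groups (G = Z2 under XOR, G̃ = Z2 × Z2, with π the projection onto the first coordinate), so only the XOR homomorphism is visible; the paper's actual proposal needs non-commutative groups in which NAND can also be realized, and nothing here is the authors' construction.

        # Toy, insecure illustration of encryption via a homomorphism pi: G~ -> G.
        # G = Z_2 (bits under XOR); G~ = Z_2 x Z_2; pi projects onto the first coordinate,
        # so ker(pi) = {(0, 0), (0, 1)} supplies the masking randomness.
        import secrets

        def pi(g):
            return g[0]

        def encrypt(bit):
            r = secrets.randbelow(2)        # random kernel element (0, r)
            return (bit % 2, r)             # (bit, 0) * (0, r) in G~

        def op(c1, c2):                     # group operation in G~; XOR on the plaintexts
            return ((c1[0] + c2[0]) % 2, (c1[1] + c2[1]) % 2)

        def decrypt(c):
            return pi(c)

        print(decrypt(op(encrypt(1), encrypt(1))))   # 0, i.e. 1 XOR 1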

    A Generalization of the Goldberg-Sachs Theorem and its Consequences

    The Goldberg-Sachs theorem is generalized to all four-dimensional manifolds endowed with a torsion-free connection compatible with the metric; the treatment includes all signatures as well as complex manifolds. It is shown that when the Weyl tensor is algebraically special, severe geometric restrictions are imposed. In particular, it is demonstrated that the simple self-dual eigenbivectors of the Weyl tensor generate integrable isotropic planes. Another result obtained here is that if the self-dual part of the Weyl tensor vanishes in a Ricci-flat manifold of (2,2) signature, the manifold must be Calabi-Yau or symplectic and admits a solution of the source-free Einstein-Maxwell equations.
    Comment: 14 pages. This version matches the published one

    Quasiperiodicity and non-computability in tilings

    We study tilings of the plane that combine strong properties of two different natures: combinatorial and algorithmic. We prove the existence of a tile set that accepts only quasiperiodic and non-recursive tilings. Our construction is based on the fixed-point construction; we improve this general technique and make it enforce the property of local regularity of tilings needed for quasiperiodicity. We also prove a stronger result: any effectively closed set can be recursively transformed into a tile set so that the Turing degrees of the resulting tilings consist exactly of the upper cone based on the Turing degrees of the latter.
    Comment: v3: the version accepted to MFCS 201